The Impact of Unlabeled Patterns in Rademacher Complexity Theory for Kernel Classifiers
Authors
Abstract
We derive new generalization bounds, based on Rademacher Complexity theory, for the model selection and error estimation of linear (kernel) classifiers; the bounds exploit the availability of unlabeled samples. In particular, we obtain two results: the first shows that the unlabeled samples can be used to reduce the confidence term of the conventional bound by a factor of three; the second shows that they can be used to obtain much tighter bounds by building localized versions of the hypothesis class that contain the optimal classifier.
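For reference, a minimal sketch of the conventional bound the abstract alludes to, assuming the standard supervised Rademacher-complexity formulation (the constants and notation below are assumptions, not taken from this paper): with probability at least 1 − δ over the draw of n labeled samples, for every f in the class F,

\[
L(f) \;\le\; \hat{L}_n(f) \;+\; \hat{R}_n(\mathcal{F}) \;+\; 3\sqrt{\frac{\log(2/\delta)}{2n}},
\]

where L(f) is the true risk, \hat{L}_n(f) the empirical risk, and \hat{R}_n(\mathcal{F}) the empirical Rademacher complexity of the class. The "confidence term" is the final square-root term; on this reading, the first result of the paper says that estimating \hat{R}_n on unlabeled samples lets that term shrink by a factor of three.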
Similar resources
A Note on Improved Loss Bounds for Multiple Kernel Learning
The paper [5] presented a bound on the generalisation error of classifiers learned through multiple kernel learning. The bound has an improved additive dependence on the number of kernels (with the same logarithmic dependence on that number). However, parts of the proof were presented incorrectly in that paper. This note remedies that weakness by restating the problem and giving a detailed proof...
Structural Risk Minimization and Rademacher Complexity for Regression
The Structural Risk Minimization principle allows estimating the generalization ability of a learned hypothesis by measuring the complexity of the entire hypothesis class. Two of the most recent and effective complexity measures are the Rademacher Complexity and the Maximal Discrepancy, which have been applied to the derivation of generalization bounds for kernel classifiers. In this work, we e...
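As context for the two measures named above, their standard empirical definitions (a sketch in textbook form, not quoted from this abstract) are

\[
\hat{R}_n(\mathcal{F}) \;=\; \mathbb{E}_{\boldsymbol{\sigma}}\!\left[\,\sup_{f \in \mathcal{F}} \frac{2}{n}\sum_{i=1}^{n} \sigma_i\, f(x_i)\right],
\qquad
\hat{D}_n(\mathcal{F}) \;=\; \sup_{f \in \mathcal{F}} \left[\frac{2}{n}\sum_{i=1}^{n/2} f(x_i) \;-\; \frac{2}{n}\sum_{i=n/2+1}^{n} f(x_i)\right],
\]

where the σ_i are i.i.d. uniform ±1 (Rademacher) variables: the Rademacher Complexity measures how well the class can correlate with random noise, while the Maximal Discrepancy measures the largest gap the class can produce between the two halves of the sample.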
The Rademacher Complexity of Co-Regularized Kernel Classes
In the multi-view approach to semi-supervised learning, we choose one predictor from each of multiple hypothesis classes, and we co-regularize our choices by penalizing disagreement among the predictors on the unlabeled data. We examine the co-regularization method used in the co-regularized least squares (CoRLS) algorithm [12], in which the views are reproducing kernel Hilbert spaces (RKHS's), a...
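For orientation, a sketch of the co-regularized objective underlying CoRLS in the usual two-view formulation (the symbols and weights below are assumptions, not drawn from this abstract): given labeled pairs (x_i, y_i) and unlabeled points u_j, one solves

\[
\min_{f_1 \in \mathcal{H}_1,\; f_2 \in \mathcal{H}_2}\;
\sum_{i=1}^{n} \Big(y_i - \tfrac{f_1(x_i)+f_2(x_i)}{2}\Big)^{2}
\;+\; \gamma_1 \|f_1\|_{\mathcal{H}_1}^{2}
\;+\; \gamma_2 \|f_2\|_{\mathcal{H}_2}^{2}
\;+\; \mu \sum_{j=1}^{m} \big(f_1(u_j) - f_2(u_j)\big)^{2},
\]

so the final penalty charges the two views for disagreeing on the m unlabeled points, which is precisely the co-regularization that the Rademacher analysis of the resulting function class must account for.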
Bounds for Learning the Kernel: Rademacher Chaos Complexity
In this paper we develop a novel probabilistic generalization bound for regularized kernel learning algorithms. First, we show that the generalization analysis of kernel learning algorithms reduces to investigation of the suprema of the homogeneous Rademacher chaos process of order two over candidate kernels, which we refer to as Rademacher chaos complexity. Our new methodology is based on the princ...
Rademacher Chaos Complexities for Learning the Kernel Problem
We develop a novel generalization bound for learning the kernel problem. First, we show that the generalization analysis of the kernel learning problem reduces to investigation of the suprema of the Rademacher chaos process of order 2 over candidate kernels, which we refer to as Rademacher chaos complexity. Next, we show how to estimate the empirical Rademacher chaos complexity by well-establis...
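For context, the homogeneous Rademacher chaos process of order two that both of these abstracts refer to can be sketched as follows (a standard form; the exact normalization used in the papers may differ):

\[
\hat{U}_n(\mathcal{K}) \;=\; \mathbb{E}_{\boldsymbol{\sigma}}\!\left[\,\sup_{K \in \mathcal{K}} \;\Bigg|\sum_{i<j} \sigma_i \sigma_j\, K(x_i, x_j)\Bigg|\,\right],
\]

where the supremum runs over the candidate kernels K and the σ_i are i.i.d. Rademacher variables; it plays the role for classes of kernels that the ordinary (order-one) Rademacher complexity plays for classes of functions.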
Journal:
Volume/Issue:
Pages: -
Publication date: 2011